141 research outputs found

    A Chemistry-Inspired Framework for Achieving Consensus in Wireless Sensor Networks

    The aim of this paper is to show how simple interaction mechanisms, inspired by chemical systems, can provide the basic tools for designing and analyzing a mathematical model for achieving consensus in wireless sensor networks characterized by balanced directed graphs. The convergence and stability of the model are first proven using new mathematical tools borrowed directly from chemical theory, and then validated by simulation for different network topologies and numbers of sensors. The underlying chemical theory is also used to derive simple interaction rules that account for practical issues, such as estimating the number of neighbors and providing robustness against perturbations. Finally, the proposed chemical solution is validated under real-world conditions by means of a four-node hardware implementation in which the exchange of information among nodes takes place in a distributed manner (with no need for any admission control or synchronization procedure), relying simply on the transmission of a pulse whose rate is proportional to the state of each sensor. Comment: 12 pages, 10 figures, submitted to IEEE Sensors Journal
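    As a minimal sketch (not the paper's actual model), the kind of state-proportional exchange described above can be illustrated as a linear consensus iteration on a balanced directed graph, where a balanced topology preserves the total "mass" so the states converge to the average; all names and parameters below are illustrative:

    ```python
    def consensus_step(x, edges, eps=0.1):
        """One synchronous update on a directed graph.

        Each node nudges its state toward those of its in-neighbours,
        analogous to mass exchange at state-proportional rates."""
        delta = [0.0] * len(x)
        for (i, j) in edges:          # directed edge i -> j
            delta[j] += eps * (x[i] - x[j])
        return [xi + di for xi, di in zip(x, delta)]

    # Balanced directed ring: every node has in-degree = out-degree = 1,
    # so the sum of the states is conserved at every step.
    n = 4
    edges = [(i, (i + 1) % n) for i in range(n)]
    x = [3.0, 7.0, 1.0, 5.0]
    target = sum(x) / n               # average = 4.0
    for _ in range(500):
        x = consensus_step(x, edges)
    print([round(v, 3) for v in x])   # all values close to 4.0
    ```

    Because the graph is balanced, each edge moves value from one node to another without creating or destroying any, which is the discrete analogue of conservation of mass in a closed chemical system.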

    Self-Healing Protocol Implementations

    Current studies on self-configuring and adaptive networks aim at developing specific, fixed protocols that are able to optimize their configuration in a variable network environment. In this talk we study the problem where the protocols need to cope with defective execution, including lossy execution or the injection of foreign code. One guiding question is the creation of robust execution circuits which can distribute over a network and which continue their service despite parts of the implementation being knocked out. The ultimate goal is to enable protocol implementations to detect by themselves that they are malfunctioning and to let them correct their own operation mode and code base. As a showcase, we present a protocol implementation which is robust against the deletion (knock-out) of any single instruction, regardless of whether this deletion affects the core protocol functionality or the resilience logic. The technique used in this first-of-its-kind example is the self-modification of the running program, which can be naturally situated in an active networking context. Ultimately, a self-correcting protocol implementation has to constantly rewrite itself according to its (self-)observed performance. In this talk we will also point to related fields such as self-correcting software, fault-tolerant quantum computing and the self-healing properties of biological systems. This is joint work with Lidia Yamamoto, Hitachi Europe.
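    The talk's actual mechanism is self-modification of the running program, which the abstract does not detail. A much weaker but related idea can be sketched as follows: if every instruction is idempotent and duplicated, the program survives the deletion of any single instruction. All names here are hypothetical toy stand-ins for protocol steps:

    ```python
    def run(program, env):
        """Execute a list of (name, fn) instructions against env."""
        for _, fn in program:
            fn(env)
        return env

    # Idempotent instructions: running a copy twice equals running it once,
    # so duplicating every instruction tolerates one knocked-out copy.
    def set_seq(env): env["seq"] = 1
    def set_ack(env): env["ack"] = env.get("seq", 0)

    base = [("set_seq", set_seq), ("set_ack", set_ack)]
    robust = [step for step in base for _ in range(2)]  # duplicate each

    # Knock out any single instruction: the outcome is unchanged.
    expected = run(list(robust), {})
    for k in range(len(robust)):
        damaged = robust[:k] + robust[k + 1:]
        assert run(damaged, {}) == expected
    print("survives any single-instruction knock-out")
    ```

    Note that this redundancy trick protects only the protocol steps themselves; protecting the resilience logic as well, as the talk's example does, is exactly what requires the stronger self-modification technique.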

    Oblivious Homomorphic Encryption

    In this paper, we introduce Oblivious Homomorphic Encryption (OHE), which provably separates the computation spaces of multiple clients of a fully homomorphic encryption (FHE) service while keeping the evaluator blind as to whom each result belongs. We justify the importance of this strict isolation property of OHE by showing an attack on a recently proposed key-private cryptocurrency scheme. Our two OHE constructions are based on a puncturing function with which the evaluator can effectively mask ciphertexts from rogue and potentially colluding clients. In the first construction, OHE1, we show that this can be implemented via an FHE scheme (with key privacy and weak wrong-key decryption properties) plus an anonymous commitment scheme. The second construction, OHE2, achieves this via a combination of a standard FHE scheme, an encryption scheme with key privacy and weak wrong-key decryption, and an anonymous commitment scheme, allowing flexibility in the choice of primitives. OHE can be used to provide provable anonymity to cloud applications, single-server implementations of anonymous messaging, as well as account-based cryptocurrencies.
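    To make the masking idea concrete, here is a toy somewhat-homomorphic scheme over the integers (in the style of DGHV). It is purely illustrative, completely insecure, and is not the paper's OHE1 or OHE2 construction; it only shows how an evaluator, holding no key, can homomorphically add a fresh encryption to a ciphertext so that it decrypts to a value unrelated to the original:

    ```python
    import random

    P = 10**6 + 3                      # secret key: an odd modulus

    def enc(bit):
        """Encrypt one bit as P*q + 2r + bit with small noise r."""
        q = random.randrange(10**9, 10**10)
        r = random.randrange(1, 100)
        return P * q + 2 * r + bit

    def dec(c):
        return (c % P) % 2

    # Additive homomorphism: adding ciphertexts XORs the plaintext bits.
    c0, c1 = enc(0), enc(1)
    assert dec(c0 + c1) == 1

    # Masking in the spirit of puncturing: the evaluator, without the key,
    # adds a fresh encryption of a random bit, so a rogue client's
    # ciphertext now decrypts to a bit unrelated to the original.
    def mask(c):
        return c + enc(random.randrange(2))

    masked = mask(enc(1))              # decrypts to 1 XOR (random bit)
    ```

    In the paper's setting the masking must additionally be anonymous and verifiable, which is where the key-privacy, weak wrong-key decryption and anonymous commitment ingredients come in.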

    Young women with breast cancer: how many are actually candidates for fertility preservation?

    Purpose: There are no data regarding the actual need for fertility preservation (FP) in breast cancer (BC) patients. Our study provides a practical needs assessment for reproductive medicine by analyzing an unselected cohort of young BC patients. This assessment considers oncological factors as well as the patient's obstetrical and gynecological history and reproductive outcome after BC diagnosis. We aimed to identify how many patients are actually potential candidates for FP and how many patients might consequently use their cryopreserved gametes to achieve pregnancy. Methods: Based on a prospective BC database, we analyzed all patients who were ≤40 years old at initial diagnosis (time period of diagnosis: 1990-2007; n=100; 7.7% of the entire BC cohort; median age: 35.9 years). Results: Using an algorithm of exclusion criteria considering disease-specific, therapy-specific and family history characteristics, 36 patients who received chemotherapy were identified as potential "classical" candidates for FP. After 5 years, 22 women were identified as potential candidates for using their cryopreserved gametes to achieve pregnancy; the majority of these patients were childless (n=16, 72.7%) and in their late reproductive years (n=12, 54.5%). Conclusions: Our study demonstrates that in a cohort of young BC patients only a minority of women are candidates for FP. Young BC patients who wish to have children in the future usually carry risk factors from both an oncological and a reproductive medicine perspective. Due to this high-risk profile, the rarity of BC at a young age, and the limited number of patients who might actually have opted for FP, these women must be offered timely and multidisciplinary counseling in highly specialized centers.

    Effect of COVID-19 on acute treatment of ST-segment elevation and Non-ST-segment elevation acute coronary syndrome in northwestern Switzerland

    Aims: To investigate the effect of the coronavirus disease 2019 (COVID-19) pandemic on the acute treatment of patients with ST-segment elevation (STEMI) and non-ST-segment elevation acute coronary syndrome (NSTE-ACS). Methods: We retrospectively identified patients presenting to the emergency department (ED) with suspected ACS. We evaluated the number of percutaneous coronary interventions (PCIs) for STEMI, NSTE-ACS, and elective PCI cases. In STEMI patients, we assessed the time from chest pain onset (cpo) to ED presentation, post-infarction left ventricular ejection fraction (LVEF), and time from ED presentation to PCI. We directly compared cases from two time intervals: January/February 2020 versus March/April 2020 (defined as the 2 months before and after the COVID-19 outbreak). In a secondary analysis, we directly compared cases from March/April 2020 with patients from the same time interval in 2019. Results: From January to April 2020, 765 patients presented to the ED with acute chest pain. ED presentations declined markedly after the COVID-19 outbreak compared with before it (a 31% relative reduction). Overall, 398 PCIs were performed, 220/398 (55.3%) before versus 178/398 (44.7%) after the outbreak. While the numbers of NSTE-ACS and elective interventions declined by 21% and 31%, respectively, the number of STEMI cases remained stable. Time from cpo to ED presentation, post-infarction LVEF, and median door-to-balloon time remained unchanged. Conclusions: In contrast to previous reports, our findings do not confirm in northwestern Switzerland the dramatic drop in STEMI cases and interventions observed in other regions and hospitals around the world.

    A decade of detailed observations (2008-2018) in steep bedrock permafrost at the Matterhorn Hörnligrat (Zermatt, CH)

    The PermaSense project is an ongoing interdisciplinary effort between the geoscience and engineering disciplines; it started in 2006 with the goal of realizing observations that had previously not been possible. Specifically, the aim is to obtain measurements of unprecedented quantity and quality based on technological advances. This paper describes a unique >10-year data record obtained from in situ measurements in steep bedrock permafrost in an Alpine environment on the Matterhorn Hörnligrat, Zermatt, Switzerland, at 3500 m a.s.l. Through the use of state-of-the-art wireless sensor technology it was possible to obtain more data of higher quality, make these data available in near real time, and tightly monitor and control the running experiments. This data set (https://doi.org/10.1594/PANGAEA.897640, Weber et al., 2019a) constitutes the longest, densest and most diverse data record in the history of mountain permafrost research worldwide, with 17 different sensor types used at 29 distinct sensor locations and over 114.5 million data points captured over a period of 10 or more years. By documenting and sharing these data in this form we contribute to making our past research reproducible and facilitate future research based on these data, e.g., in the areas of analysis methodology, comparative studies, assessment of environmental change, natural hazard warning and the development of process models. Finally, the cross-validation of four different data types clearly indicates the dominance of thawing-related kinematics.

    On the Structuring of Computer Communications

    In the foreground of this work stands the question of whether communication should be understood as an interpretation process or as an instructional phenomenon. The interpretative view of communication is the one advocated by Shannon's classical communication model of 1949. This (technical) model has even penetrated the human sciences and is the basis of today's computer communications. Its main characteristic is the introduction of a coder/decoder pair separate from the information source and destination - a view which we call the PDU paradigm. In terms of computer communications, the coder/decoder entities are known as protocol stacks, which exchange highly structured protocol data units (PDUs). All a receiving protocol stack has to do is analyse each received PDU and, according to the data values found, react by changing an internal state, by generating a new PDU, or by delivering a 'decoded' message to the destination entity. The new paradigm, called the messenger paradigm, replaces the exchange of data values by the exchange of instructions. Such groups of instructions, named messengers, are no longer analysed when received: they simply have to be executed. Part 1 of this work examines the messenger paradigm from a rather philosophical viewpoint, using the conceptual framework of semiotics but also relating the messenger paradigm to the virus mechanism found in biology and informatics. Part 2 applies the messenger paradigm to some protocols of computer communications and introduces the key elements of a first execution environment for messengers. It is shown how Stenning's sliding window protocol can be reformulated in the messenger context, leading to an implementation of this protocol which no longer requires preinstalled protocol entities. Unexpected theoretical problems become manifest when one tries to relate a messenger-based protocol realisation to its ordinary PDU-based form.
    One of the conclusions of this work is that there may be protocols which can be expressed in terms of messengers but which have no equivalent representation under the PDU paradigm. Part 3 finally investigates the consequences of applying the messenger paradigm to computer communications in general. Messengers offer in principle complete liberty for the structuring of computer communications. However, in order to be useful, some common structures (i.e. conventions) are necessary. The problems of setting up a meta-communication architecture are then discussed; it seems that new insights may come from extending the principle of computational reflection to a distributed environment, but also from existing theories of organisation, namely cybernetics and sociology. More research is needed in order to better understand the implications of the messenger paradigm not only for computer communications, but also for a general theory of communication.
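    The contrast between the two paradigms can be sketched in a few lines: a PDU receiver owns a pre-installed protocol entity that parses structured data and reacts to field values, whereas a messenger receiver executes whatever small program arrives. This is a hedged toy illustration, not the thesis's messenger execution environment, and the field names and protocol logic are invented for the example:

    ```python
    # PDU paradigm: a pre-installed protocol entity parses each data unit
    # and reacts according to the field values it finds.
    def pdu_receiver(state, pdu):
        if pdu["type"] == "DATA" and pdu["seq"] == state["expected"]:
            state["delivered"].append(pdu["payload"])
            state["expected"] += 1
        return {"type": "ACK", "seq": state["expected"]}

    # Messenger paradigm: what arrives is a small program; the receiver
    # does not analyse it, it simply executes it in a local namespace.
    def messenger_receiver(state, code):
        env = {"state": state, "reply": None}
        exec(code, {}, env)            # toy sandbox -- not safe in practice
        return env["reply"]

    messenger = """
    if state['expected'] == 0:
        state['delivered'].append('hello')
        state['expected'] += 1
    reply = {'type': 'ACK', 'seq': state['expected']}
    """

    s = {"expected": 0, "delivered": []}
    ack = messenger_receiver(s, messenger)
    print(s["delivered"], ack)
    ```

    The messenger carries its own protocol logic with it, so the receiving side needs no preinstalled entity for this protocol - which is exactly the property the work exploits for Stenning's sliding window protocol.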